Tags: binary cross-entropy, lecture-16, multiple outputs
A multi-label classifier has 3 output nodes with sigmoid activations. The true labels are \(\vec y = (1, 0, 1)\) and the predicted probabilities are \(\vec h = (0.9, 0.2, 0.8)\).
Compute the binary cross-entropy loss. Leave your answer in terms of \(\log\).
\(-\log(0.9) - \log(0.8) - \log(0.8) = -\log(0.9) - 2\log(0.8)\).
By the binary cross-entropy formula:
\(\mathcal{L} = -\sum_{i=1}^{3}\left[y_i \log h_i + (1 - y_i)\log(1 - h_i)\right]\)
Evaluating each term:
\(y_1 = 1\), so the first term is \(-\log(0.9)\).
\(y_2 = 0\), so the second term is \(-\log(1 - 0.2) = -\log(0.8)\).
\(y_3 = 1\), so the third term is \(-\log(0.8)\).
The total is \(-\log(0.9) - 2\log(0.8)\).
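As a quick numerical check, a minimal sketch of the summed (unaveraged) binary cross-entropy used in these cards, with the helper name `bce` chosen for illustration:

```python
import math

def bce(y, h):
    # Binary cross-entropy summed over output nodes (no averaging),
    # matching the per-node sum used in the solution above.
    return -sum(yi * math.log(hi) + (1 - yi) * math.log(1 - hi)
                for yi, hi in zip(y, h))

loss = bce([1, 0, 1], [0.9, 0.2, 0.8])
expected = -math.log(0.9) - 2 * math.log(0.8)
print(loss, expected)  # both are approximately 0.5516
```

The same function reproduces the other two answers, e.g. `bce([0, 1, 0, 1], [0.3, 0.7, 0.1, 0.9])` equals \(-2\log(0.7) - 2\log(0.9)\).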
Tags: binary cross-entropy, lecture-16, multiple outputs
A multi-label classifier has 4 output nodes with sigmoid activations. The true labels are \(\vec y = (0, 1, 0, 1)\) and the predicted probabilities are \(\vec h = (0.3, 0.7, 0.1, 0.9)\).
Compute the binary cross-entropy loss. Leave your answer in terms of \(\log\).
\(-2\log(0.7) - 2\log(0.9)\).
By the binary cross-entropy formula:
\(\mathcal{L} = -\sum_{i=1}^{4}\left[y_i \log h_i + (1 - y_i)\log(1 - h_i)\right]\)
Evaluating each term:
\(y_1 = 0\), so the first term is \(-\log(1 - 0.3) = -\log(0.7)\).
\(y_2 = 1\), so the second term is \(-\log(0.7)\).
\(y_3 = 0\), so the third term is \(-\log(1 - 0.1) = -\log(0.9)\).
\(y_4 = 1\), so the fourth term is \(-\log(0.9)\).
The total is \(-2\log(0.7) - 2\log(0.9)\).
Tags: binary cross-entropy, lecture-16, multiple outputs
A multi-label classifier has 3 output nodes with sigmoid activations. The true labels are \(\vec y = (1, 1, 0)\) and the predicted probabilities are \(\vec h = (0.8, 0.6, 0.4)\).
Compute the binary cross-entropy loss. Leave your answer in terms of \(\log\).
\(-\log(0.8) - 2\log(0.6)\).
By the binary cross-entropy formula:
\(\mathcal{L} = -\sum_{i=1}^{3}\left[y_i \log h_i + (1 - y_i)\log(1 - h_i)\right]\)
Evaluating each term:
\(y_1 = 1\), so the first term is \(-\log(0.8)\).
\(y_2 = 1\), so the second term is \(-\log(0.6)\).
\(y_3 = 0\), so the third term is \(-\log(1 - 0.4) = -\log(0.6)\).
The total is \(-\log(0.8) - 2\log(0.6)\).